Oracle complexity of second-order methods for smooth convex optimization
Authors
Abstract
Similar Resources
Oracle Complexity of Second-Order Methods for Smooth Convex Optimization
Second-order methods, which utilize gradients as well as Hessians to optimize a given function, are of major importance in mathematical optimization. In this work, we study the oracle complexity of such methods, or equivalently, the number of iterations required to optimize a function to a given accuracy. Focusing on smooth and convex functions, we derive (to the best of our knowledge) the firs...
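The abstract above concerns methods that query both gradients and Hessians. As an illustrative sketch only (not the paper's algorithm), the canonical second-order method is Newton's method, where each iteration makes one call to a second-order oracle; the example problem and step rule below are assumptions for demonstration:

```python
import numpy as np

def newton_method(grad, hess, x0, tol=1e-10, max_iter=50):
    """Minimize a smooth convex function given gradient and Hessian oracles.

    Illustrative sketch of a generic second-order method, not the
    paper's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)  # one oracle call: gradient
        if np.linalg.norm(g) < tol:
            break
        # Newton step: solve H d = -g rather than inverting H explicitly.
        d = np.linalg.solve(hess(x), -g)  # one oracle call: Hessian
        x = x + d
    return x

# Example: f(x) = 1/2 x^T A x - b^T x; for a quadratic, one Newton step
# lands exactly on the minimizer A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = newton_method(lambda x: A @ x - b, lambda x: A, np.zeros(2))
```

For a quadratic the method converges in a single iteration; the oracle-complexity question studied in the paper is how many such iterations are needed in the worst case over all smooth convex functions.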
First-order methods of smooth convex optimization with inexact oracle
In this paper, we analyze different first-order methods of smooth convex optimization employing inexact first-order information. We introduce the notion of an approximate first-order oracle. The list of examples of such an oracle includes the smoothing technique, Moreau-Yosida regularization, Modified Lagrangians, and many others. For different methods, we derive complexity estimates and study the ...
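A minimal sketch of the inexact-oracle setting described above (an assumption for illustration, not the paper's construction): run plain gradient descent, but let the oracle return the gradient only up to an additive error of norm at most delta. The iterates then settle in a delta-dependent neighborhood of the optimum rather than converging exactly:

```python
import numpy as np

def gradient_descent_inexact(inexact_grad, x0, step, n_iter=200):
    """Gradient descent driven by an inexact first-order oracle.

    Illustrative sketch: the oracle's error keeps the iterates from
    converging past a neighborhood whose radius scales with delta.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = x - step * inexact_grad(x)
    return x

rng = np.random.default_rng(0)
delta = 1e-3  # assumed oracle error bound, for demonstration
A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([1.0, 1.0])

def noisy_grad(x):
    # Exact gradient of 1/2 x^T A x - b^T x, perturbed by a vector of
    # norm exactly delta -- a simple model of an approximate oracle.
    e = rng.standard_normal(2)
    return A @ x - b + delta * e / np.linalg.norm(e)

x_hat = gradient_descent_inexact(noisy_grad, np.zeros(2), step=0.2)
```

Here the true minimizer is A^{-1} b = (0.5, 0.25), and the returned point is accurate only up to an error on the order of delta.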
Oracle Complexity of Second-Order Methods for Finite-Sum Problems
Finite-sum optimization problems are ubiquitous in machine learning, and are commonly solved using first-order methods which rely on gradient computations. Recently, there has been growing interest in second-order methods, which rely on both gradients and Hessians. In principle, second-order methods can require far fewer iterations than first-order methods, and hold the promise for more effici...
Semi-smooth Second-order Type Methods for Composite Convex Programs
The goal of this paper is to study approaches to bridge the gap between first-order and second-order type methods for composite convex programs. Our key observations are: i) Many well-known operator splitting methods, such as forward-backward splitting (FBS) and Douglas-Rachford splitting (DRS), actually define a possibly semi-smooth and monotone fixed-point mapping; ii) The optimal solutions o...
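To make observation i) above concrete, here is a minimal sketch (an assumption for illustration, not the paper's method) of forward-backward splitting (FBS) on the composite problem min_x 1/2 ||Ax - b||^2 + lam ||x||_1. Each iteration applies the fixed-point mapping T(x) = prox_{t*lam*||.||_1}(x - t*A^T(Ax - b)), whose fixed points are exactly the minimizers:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fbs(A, b, lam, t, n_iter=500):
    """Forward-backward splitting: iterate the fixed-point mapping
    x <- prox_{t*lam*||.||_1}(x - t * A^T (A x - b))."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - t * A.T @ (A @ x - b), t * lam)
    return x

# Example with A = I: the minimizer is soft_threshold(b, lam) in
# closed form, so the fixed point can be checked by hand.
A = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 0.05])
x_hat = fbs(A, b, lam=0.1, t=0.5)
```

The semi-smoothness the paper exploits enters through the soft-thresholding step, which is piecewise linear rather than differentiable; second-order type methods can be built on a generalized Jacobian of this fixed-point mapping.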
Intermediate Gradient Methods for Smooth Convex Problems with Inexact Oracle
Between the robust but slow (primal or dual) gradient methods and the fast but error-sensitive fast gradient methods, our goal in this paper is to develop first-order methods for smooth convex problems with intermediate speed and intermediate sensitivity to errors. We develop a general family of first-order methods, the Intermediate Gradient Method (IGM), based on two sequences of coefficie...
Journal
Journal title: Mathematical Programming
Year: 2018
ISSN: 0025-5610, 1436-4646
DOI: 10.1007/s10107-018-1293-1